Vapnik-Chervonenkis Dimension of Neural Nets
Similar Articles
Contextual Modeling Using Context-Dependent Feedforward Neural Nets
The paper addresses the problem of using contextual information in neural nets that solve problems of a contextual nature. The models of a context-dependent neuron and a multi-layer net are recalled and supplemented by an analysis of context-dependent and hybrid net architectures. The properties of context-dependent nets are discussed and compared with those of traditional nets, considering ...
Efficient Learning of Contextual Mappings by Context-Dependent Neural Nets
The paper addresses the problem of using contextual information in neural nets that solve problems of a contextual nature. The model of a context-dependent neuron is unique in that it allows the weights to change according to contextual variables even after the learning process has been completed. The structures of context-dependent neural nets are outlined, the Vapnik-Chervonenkis dimension ...
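As a rough sketch of this mechanism (the function names and the linear form of the weight function are illustrative assumptions, not the paper's exact model), a context-dependent neuron can be written as an ordinary threshold unit whose weight vector w(z) = A z + b is computed from a context vector z, so the effective weights vary with context even after A and b have been fixed by training:

    import numpy as np

    def context_dependent_neuron(x, z, A, b):
        """Toy context-dependent threshold neuron: the weight vector is a
        function w(z) = A @ z + b of the context vector z, so the weights
        change with context even though A and b are fixed after training.
        (Illustrative sketch, not the paper's model.)"""
        w = A @ z + b                      # context-dependent weights
        return np.heaviside(w @ x, 0.0)    # threshold activation

    # Usage: 3 inputs, 2 context variables; same input x, different contexts.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(3, 2))
    b = rng.normal(size=3)
    x = np.array([1.0, -0.5, 2.0])
    print(context_dependent_neuron(x, np.array([1.0, 0.0]), A, b))
    print(context_dependent_neuron(x, np.array([0.0, 1.0]), A, b))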
Quantifying Generalization in Linearly Weighted Neural Networks
The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of certain types of linearly weighted neural networks ...
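For reference, the link this abstract alludes to: in the PAC framework, a hypothesis class of VC dimension d is learnable to accuracy ε with confidence 1 − δ from a sample whose size grows only polynomially in d, 1/ε, and 1/δ. A common form of the sample-complexity bound is

    m \;=\; O\!\left(\frac{1}{\varepsilon}\left(d \log\frac{1}{\varepsilon} + \log\frac{1}{\delta}\right)\right),

so a finite VC dimension is precisely what makes distribution-independent generalization guarantees possible.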
VC Dimension of Neural Networks
This paper presents a brief introduction to Vapnik-Chervonenkis (VC) dimension, a quantity which characterizes the difficulty of distribution-independent learning. The paper establishes various elementary results, and discusses how to estimate the VC dimension in several examples of interest in neural network theory.
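To make the definition concrete: a class shatters a point set if it can realize every ±1 labeling of it, and the VC dimension is the size of the largest shatterable set. The brute-force check below (an illustrative sketch with made-up point sets, not code from any of these papers) confirms the classic fact that a single linear threshold unit in the plane can shatter 3 points in general position but not the 4 corners of a square:

    import itertools
    import numpy as np

    def shattered_by_threshold(points):
        """True if some affine threshold unit sign(w @ x + b) realizes every
        +/-1 labeling of the given 2-D points. Feasibility of each labeling
        is checked by random search over parameters (illustrative only)."""
        rng = np.random.default_rng(0)
        candidates = [(rng.normal(size=2), rng.normal()) for _ in range(10000)]
        for labels in itertools.product([-1, 1], repeat=len(points)):
            if not any(all(np.sign(w @ p + b) == y for p, y in zip(points, labels))
                       for w, b in candidates):
                return False  # some labeling cannot be realized
        return True

    three = [np.array(p) for p in [(0, 0), (1, 0), (0, 1)]]
    four = three + [np.array((1, 1))]
    print(shattered_by_threshold(three))  # True: all 8 labelings are separable
    print(shattered_by_threshold(four))   # False: the XOR labeling is not

Note that the negative answer is exact (no threshold unit realizes the XOR labeling, so the search cannot find one), while the positive answer relies on the random search finding a witness for each labeling.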
Vapnik-Chervonenkis Dimension of Recurrent Neural Networks
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, polynomial, ...
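One standard reasoning step behind such bounds (stated here as background, not necessarily the paper's proof technique) is that a recurrent net run for k time steps unrolls into a depth-k feedforward net whose layers share weights, so feedforward VC-dimension bounds can be applied to the unrolled network. A minimal sketch, assuming a single-layer threshold RNN:

    import numpy as np

    def run_recurrent(x_seq, W, U, h0):
        """Run a single-layer threshold RNN: h_t = step(W h_{t-1} + U x_t)."""
        h = h0
        for x in x_seq:
            h = np.heaviside(W @ h + U @ x, 0.0)
        return h

    def run_unrolled(x_seq, W, U, h0):
        """The same computation written as k stacked feedforward layers that
        share the matrices W and U: layer t maps h_{t-1} to h_t."""
        layers = [(W, U, x) for x in x_seq]  # depth-k feedforward net
        h = h0
        for W_t, U_t, x_t in layers:
            h = np.heaviside(W_t @ h + U_t @ x_t, 0.0)
        return h

    rng = np.random.default_rng(1)
    W, U = rng.normal(size=(4, 4)), rng.normal(size=(4, 3))
    h0 = np.zeros(4)
    xs = [rng.normal(size=3) for _ in range(5)]
    assert np.array_equal(run_recurrent(xs, W, U, h0), run_unrolled(xs, W, U, h0))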
Publication date: 1995